Market Roundup

August 6, 2004

Major Server Vendors Make Major Commitments to “Take Two” of Intel’s 64‑bit Computing Direction

Back to the Shadows Again

Novell and SuSE Pop Up Linux 2.6 Kernel

Get Along and Go Along: IBM and Mayo Clinic Expand Medical Collaboration

Sun Tackles the Silly Season

 


Major Server Vendors Make Major Commitments to “Take Two” of Intel’s 64‑bit Computing Direction

By AJ Dennis

Hewlett-Packard, IBM, and Dell have each announced availability of new and upgraded servers based on Intel’s new Xeon processor with 64-bit extensions, joining the 64-bit x86-compatible market developed and driven by AMD. In addition to the 64-bit CPUs, the servers have a number of new technologies in common, including DDR2 memory, an 800MHz front-side bus, and PCI Express I/O. HP will start shipping two two-way Nocona-based ProLiant tower servers with 3GHz and 3.4GHz processors this week. IBM is also introducing a two-way blade server and its one-way servers this week, while its announced two-way servers are expected to be available late this month. Trailing these offerings, Dell expects to release a 1U rack-mount server and a pedestal server based on EM64T technology in October, with blade versions of the servers to follow later in the fall.

Intel’s Extended Memory 64 Technology (EM64T) started out under the codename Yamhill. After several years of denying that the project existed, Intel acknowledged the effort in 2003 and in early 2004 announced an implementation of the AMD64 architecture, an extension to the IA-32 instruction set that adds 64-bit capabilities to the x86 architecture. With this implementation on the Xeon platform, Intel’s x86 line has made a significant leap, in many ways obsolescing its predecessor, the Xeon DP. EM64T is manufactured on Intel’s "Prescott" core with a 90-nanometer process rather than the earlier Xeon DP’s 130-nanometer process, giving Intel more real estate for more transistors at lower power and allowing both the L1 and L2 caches to be doubled for improved performance.
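As a side note for readers wondering how software can tell what it is running on: the following is a minimal, hypothetical Java sketch, not anything drawn from Intel’s or the server vendors’ documentation, that checks the standard os.arch property and the Sun-JVM-specific sun.arch.data.model property to see whether the JVM and platform appear to be 64-bit. The property values named in the comments are assumptions typical of JVMs of this era.

// Minimal sketch: report whether the underlying JVM/platform appears to be 64-bit.
// Property names and values vary by JVM vendor; "sun.arch.data.model" is specific
// to Sun JVMs, and "amd64"/"x86_64" are values typically reported on AMD64/EM64T
// systems. Treat all of these as illustrative assumptions.
public class ArchCheck {
    public static void main(String[] args) {
        String arch = System.getProperty("os.arch", "unknown");
        String dataModel = System.getProperty("sun.arch.data.model"); // may be null on non-Sun JVMs

        boolean looks64Bit = "64".equals(dataModel)
                || arch.indexOf("64") >= 0;   // e.g., "amd64", "x86_64", "ia64"

        System.out.println("os.arch             = " + arch);
        System.out.println("sun.arch.data.model = " + dataModel);
        System.out.println("64-bit JVM/platform = " + looks64Bit);
    }
}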

EM64T’s journey to market has been a long and convoluted one, complicated by Intel’s internal politics and protectionism, as Intel did not want to send the market mixed signals about the future viability of its Itanium IA-64 processors. However, Intel was forced to respond to the early success of AMD64 technology. With its Itanium IA-64 efforts still “a bridge too far,” Intel made market-driven moves that are both incisive and insightful. By effectively adopting the AMD64 specifications, Intel has swallowed hard and lowered the threshold to avoid competitively driven complexity, in hopes of building confidence in an IA-32/64-bit “industry standard.” While Intel’s approach is remarkably market savvy, there are technology “incompletes” that reflect the company’s speedy course corrections. For instance, EM64T platforms handle certain accesses to memory above 4GB by copying data from wherever it resides to a fixed location below the 4GB line. Also, EM64T’s PCI Express implementation has been revealed to be a “work in progress,” and CPU-level communication with memory and between CPUs is held back by existing system-level architecture that Intel has yet to address. What Intel has addressed is its need to be in the IA-32/64-bit marketplace… immediately. With the commitments announced this week by these major server vendors, Intel is correctly leading with its formidable marketing might, and while customers catch on to 64-bit computing, Intel’s technology will surely play catch-up.

Back to the Shadows Again

By Jim Balderston

IBM has announced that it is contributing its Cloudscape Java-based database, valued at $85 million, to the open source community. IBM acquired the database when it bought Informix, which had paid $85 million for Cloudscape, the company that originally designed it. IBM will turn the product over to the Apache Software Foundation, which will rename the offering Derby. The database is not designed to compete against those made by Oracle, IBM, or Microsoft; instead it is designed to be embedded within Java-based applications such as small Web sites or point-of-sale systems.
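To make that embedding model concrete, here is a minimal sketch of how a small Java application might run the Cloudscape/Derby engine in-process through plain JDBC. It assumes the Derby library is on the classpath and uses the commonly documented embedded driver class org.apache.derby.jdbc.EmbeddedDriver and jdbc:derby: URL form; the database and table names are purely illustrative.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch of embedding the Cloudscape/Derby engine inside a Java application.
public class EmbeddedDerbyDemo {
    public static void main(String[] args) throws Exception {
        // Loading the embedded driver boots the database engine inside this JVM;
        // there is no separate database server to install or administer.
        Class.forName("org.apache.derby.jdbc.EmbeddedDriver");

        // ";create=true" creates the database on first use. "posdb" is an
        // illustrative name for, say, a small point-of-sale application.
        Connection conn = DriverManager.getConnection("jdbc:derby:posdb;create=true");
        Statement stmt = conn.createStatement();

        stmt.executeUpdate("CREATE TABLE sales (id INT PRIMARY KEY, item VARCHAR(64), amount DECIMAL(8,2))");
        stmt.executeUpdate("INSERT INTO sales VALUES (1, 'widget', 19.95)");

        ResultSet rs = stmt.executeQuery("SELECT item, amount FROM sales");
        while (rs.next()) {
            System.out.println(rs.getString("item") + "  " + rs.getBigDecimal("amount"));
        }

        rs.close();
        stmt.close();
        conn.close();
    }
}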

About a zillion years ago, when the Internet and Java were VBD (very big deals), Sun Microsystems held a Java-themed event at Fort Mason Center in San Francisco. A similar Java pep rally the year before had not been as well attended and featured, as its highest-profile non-Sun participant, that darling of the Internet, Netscape. On this relatively sunny day in San Francisco, however, it was not Netscape that took second billing on stage with Scott McNealy; it was Lou Gerstner of IBM. At that moment, an air of legitimacy surrounded the Java development environment. IBM’s commitment to Java served as a foundation for Java development in the following years, far more than Netscape’s ever did or could. Call this Day One of Java’s life.

Fast-forward eight years and we still see IBM as one of the prime movers of Java. With this latest announcement, IBM continues its strategy of seeding the marketplace with technology it supports, in the hope that broader market adoption will lead to greater use of IBM products and services in the future. IBM tosses out this database, something it can call surplus, and provides a valuable piece of technology to others building applications that may well end up running on IBM’s WebSphere server. Contrast this approach with the one taken by Sun, which attempted to maintain strict and total control over Java’s development directions in the hope of reaping the benefits of controlling the world of Java and gaining license fees in the bargain. To our minds, IBM’s approach not only does more to benefit the market for Java-related products than Sun’s actions; it does more for IBM than Sun’s actions do or did for Sun. When tabulating and contrasting the impact these companies are having on the world of Java development, one has to say without hesitation that Sun now resides in the shadow of IBM in this regard, something very few people would have predicted on that clear day in San Francisco so many years ago. Except, of course, us (in our Zona persona).

Novell and SuSE Pop Up Linux 2.6 Kernel

By Charles King and Jim Balderston

Novell made a series of announcements at LinuxWorld concerning new partnerships and product developments surrounding the release of SuSE Linux Enterprise Server 9. The new version is based on the Linux 2.6 kernel, which offers greater scalability, supporting servers with up to sixteen processors. According to Novell, SuSE Linux Enterprise Server 9 offers enhanced management tools and security, and is designed for the needs of enterprise workgroups and datacenters. In a related announcement, Novell indicated that it will expand its ongoing relationship with JBoss. Going forward, Novell will ship the JBoss application server bundled with SuSE Linux Enterprise Server and Novell exteNd. According to Novell, the JBoss application server is the first open source application server to attain J2EE 1.4 compliance, and the company indicated that the next version of exteNd, due out in late 2005, will include the JBoss application server 4.x in place of the Novell exteNd application server. SuSE Linux Enterprise Server 9 is available now and supports hardware architectures including x86, AMD64 and Intel EM64T, Itanium, POWER, IBM eServer zSeries, and IBM S/390. A yearly maintenance subscription for x86 architectures starts at US$349 per two-CPU server. Current users of SuSE Linux Enterprise Server with active upgrade protection/maintenance will be able to download version 9 at no charge.
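As a purely illustrative aside on what that sixteen-processor headroom can mean for server software, the sketch below shows a Java program sizing a worker pool to the number of processors the kernel exposes. The pool-sizing policy and the placeholder workload are our own assumptions, not anything drawn from Novell’s or JBoss’s documentation.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Minimal sketch: size a worker pool to the processors the OS puts online.
public class PoolSizing {
    public static void main(String[] args) throws InterruptedException {
        // Ask the JVM how many processors the kernel reports as available.
        int cpus = Runtime.getRuntime().availableProcessors();
        System.out.println("Online processors: " + cpus
                + " (kernel " + System.getProperty("os.version") + ")");

        // Fixed pool sized to the processor count; the 4x task fan-out below
        // is an arbitrary stand-in for real request handling.
        ExecutorService pool = Executors.newFixedThreadPool(cpus);
        for (int i = 0; i < cpus * 4; i++) {
            final int task = i;
            pool.execute(new Runnable() {
                public void run() {
                    System.out.println("task " + task + " handled by "
                            + Thread.currentThread().getName());
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}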

These announcements offer an interesting perspective on the dynamic confluence between Novell and open source. Though doubters feigned confusion over the company’s acquisition of SuSE and its embrace of Linux, SuSE Linux Enterprise Server 9 suggests that the company was looking further down the road than many of its critics and competitors. The fact is that the arrival of the 2.6 kernel significantly advances Linux’s position on the enterprise track. By taking advantage of the 2.6 kernel’s inherent capabilities, and adding its own extensive enterprise-class experience and solutions, Novell has taken a big step ahead of many competitors. Though its rivals will be offering products based on the 2.6 kernel soon enough, Novell is showing that it is ready, willing, and able to deliver products well ahead of the curve. With the ongoing development of SuSE Linux and its broad endorsement by IT vendors including AMD, HP, CA, IBM, Intel, and Oracle, Novell is proving to the market that the acquisition of SuSE makes sense both as good business and as innovation.

From a higher vantage point, these announcements illustrate the growing importance of the confluence of Linux and J2EE technologies as an increasingly concrete de facto industry standard for application development and integration. The increasing scalability of SuSE Linux products puts more distance between the Linux/J2EE crowd and those working within the .NET development environment, especially in the high-stakes, high-end enterprise arena. The leverage that a high-performance application server such as JBoss offers its partners is a critical issue here, reflected both in Novell’s willingness to replace its own exteNd application server with JBoss 4.x and in Red Hat’s announcement of its own native application server earlier this week. However, since Red Hat’s offering has not attained J2EE 1.4 compliance and lacks the comprehensive toolkit of enterprise-class infrastructure solutions afforded by Novell, Novell/SuSE appears well positioned to be the vendor of choice for businesses looking for a true enterprise-class Linux.

Get Along and Go Along: IBM and Mayo Clinic Expand Medical Collaboration

By Charles King

IBM and the Mayo Clinic have announced the next steps of a broad collaboration aimed at using IT to enhance medical efforts ranging from essential patient care to genomic and proteomic research. The efforts include plans to map current and historical Mayo patient data and to link it with new classes of medical information via IT solutions. IBM and Mayo announced the completion of the first step of their collaboration (which kicked off in May 2001): the integration into a unified system of 4.4 million medical records, including test results, links to medical image data, and patient demographic information. The organizations now plan to develop data mining tools that will allow physicians to work with this information to develop more effective treatments. In addition, the Mayo Clinic will utilize IBM’s Deep Computing capabilities, including Blue Gene supercomputers, for work in genomic and proteomic research and in molecular modeling, with the goal of better understanding disease causes, prevention, and treatment. Finally, IBM and Mayo are collaborating on a number of pervasive devices and data mining tools that will allow physicians to access customized patient data “on demand,” enhancing medical diagnosis and treatment. IBM and the Mayo Clinic said both organizations will make significant investments in people and technology in these efforts, and that all projects will incorporate advanced data protection and security features to ensure patient privacy.

The “paperless” office was once a common buzzword among IT vendors (save those in the printer business, of course), but continuing IT advances have made it ever more critical to find effective means of dealing with ever-increasing amounts of data. This is an especially crucial issue in medicine, where information overload infects every level of medical study and practice. Most importantly, without the means to deal efficiently with gluts of information, advances in medical research are less likely to influence patient treatment. However, how IT vendors help doctors and research scientists come to grips with these issues requires particular care. While the typical vendor practice of partnering with ISVs to develop and deliver applications works well enough for practical, highly targeted solutions, it is less effective for dealing with more complex problems. This is where IBM and the Mayo Clinic’s collaboration comes into play. Both organizations bring sophisticated, even exclusive strengths to the table. By working closely together, the two can potentially create a whole greater than the sum of its parts.

Both IBM and Mayo stand to profit from this effort. For the Mayo Clinic, the collaboration provides a means for practically leveraging an overwhelming amount of historic patient information to both ensure the effectiveness of current methodologies and develop new classes of treatments. Further, integrating information gleaned from genomic, proteomic, and molecular research should significantly enhance clinical treatment and overall patient health. For IBM, the partnership provides both strategic and tactical benefits. To begin, it allows the company to be associated with one of the nation’s preeminent medical institutions, and to further enhance its standing as a key driver in improving patient care and health through technology, an effort IBM is also pursuing via partnerships with institutions including Duke University, Fred Hutchinson Cancer Research Center, Johns Hopkins University, H. Lee Moffitt Cancer Institute, and the University of California San Francisco. More practically, by taking on a range of highly technical efforts such as these, IBM is crafting a base for a host of commercial medical solutions. For many vendors, simple necessity requires them to start small and build their way up to more complex solutions. IBM’s size, skills, and experience allow the company to pursue a significantly different path that offers tangible benefits to the company, its partners, and eventually, medical patients of every kind.

Sun Tackles the Silly Season

By Joyce Tompsett Becknell

This week Sun’s president Jonathan Schwartz publicly toyed with the idea of Sun purchasing Novell, to the amusement and annoyance of the press. Not one to miss an opportunity to give reporters something to do between the political conventions and the Olympics, Schwartz floated yet another trial balloon out into the world. The press has responded with a barrage of fodder of its own.

As we’ve indicated before, we used to be fans of Sun. That was back in the days when the company was serious about computing and spent its time arguing architectural philosophies. Now Sun looks to us, dangerously, as though it might be more interested in following SCO’s approach to Linux: ridiculing the community and causing dissension with IBM and HP, who are seriously working with the Linux community. Sun has spent most of the last year playing sleight of hand with Linux. At one point Scott even dressed up like a penguin (a serious sartorial error even for him), and then the company boasted that Solaris was still far superior to Linux and that customers should not choose Linux. One is no longer sure where Sun stands on the issue, or why. Even Dell — the ultimate technology opportunist — articulates a clearer position.

What is clear is that in making peace with Microsoft, Sun has decided that IBM is more than ever the enemy to be attacked in the press. Schwartz seems to think that purchasing Novell could be a good idea for Sun and a bad idea for IBM. However, based on Sun’s past handling of Linux, we don’t believe that Linux in Sun’s hands is a good idea for anybody. From Sun’s perspective, the purchase of Novell would leave it with an awful lot of software that competes with existing Sun products; that would not be a well-thought-out move, as Sun’s software strategy needs focusing, not diluting. For the user community this is an even uglier prospect. Like most others who follow Linux, we don’t believe Sun is seriously interested in purchasing a Linux distribution. Sun is interested in selling Solaris, which of course is the problem. Novell has produced a steady stream of announcements for SuSE and its other products, working to integrate NetWare, the Ximian products, and SuSE into a true enterprise-ready Linux. However, despite these technical achievements, serious business impediments remain. This week the city of Munich put its huge Linux migration campaign on hold. The reasons weren’t technical: the Germans are concerned about potential patent issues and the legal and financial risks of deploying government systems on Linux. This could have a huge impact, as governments around the world, and particularly in Europe, have been watching Munich while making similar decisions about moving to Linux. Public trial balloons intended to create instability in the market undermine the industry’s ability to steer customers soberly toward solid business decisions involving Linux and open source applications. Sageza hopes Sun has exhausted its trial balloon supply for now and will return to the serious business of re-inflating its own corporate Hindenburg.


The Sageza Group, Inc.

32108 Alvarado Blvd #354

Union City, CA 94587

650·390·0700     fax 650·649·2302

London +44 (0) 20·7900·2819

Milan +39 02·9544·1646

 

sageza.com

 

Copyright © 2004 The Sageza Group, Inc. May not be duplicated or retransmitted without written permission.